Blockchain-based decentralized attribute-based encryption scheme for revocable attributes
Haiying MA, Jinzhou LI, Jikun YANG
Journal of Computer Applications    2023, 43 (9): 2789-2797.   DOI: 10.11772/j.issn.1001-9081.2023020138
To address the problems of existing Attribute-Based Encryption (ABE) schemes, such as the low efficiency of attribute revocation and the difficulty of coordinating the distribution and revocation of user attribute keys, a Blockchain-based Decentralized Attribute-Based Encryption scheme for Revocable attributes (BRDABE) was proposed. Firstly, the consensus-driven blockchain architecture was used to shift the trust issue of key distribution from the attribute authorities to the distributed ledger, and smart contracts were used to record the status of user attributes and data sharing and to assist the attribute authorities in revoking user attributes. When revoking a user's attribute, the attribute authority used smart contracts to automatically screen out the affected data owners and non-revoked authorized users and to compute the ciphertext update key and the key update key related to the revoked attribute, so that the ciphertext and keys were updated off-chain. Then, the version key and the user's global identity were embedded in the attribute private key, so that the identity terms in the session key ciphertext and the user's attribute private key cancelled each other out during decryption. Under reasonable assumptions, the BRDABE scheme was proved to resist collusion attacks by users and to satisfy the forward and backward security of attribute revocation. Experimental results show that the times of user key generation, encryption, decryption and attribute revocation increase linearly with the number of user attributes. With the same number of attributes, compared with the DABE (Decentralizing Attribute-Based Encryption) scheme, the BRDABE scheme reduces the decryption time by 94.06% to 94.75%, and compared with the EDAC-MCSS (Effective Data Access Control for Multiauthority Cloud Storage Systems) scheme, it reduces the attribute revocation time by 92.19% to 92.27%. Therefore, the BRDABE scheme not only improves the efficiency of attribute revocation but also guarantees the forward and backward security of shared data.

Comparative density peaks clustering algorithm with automatic determination of clustering center
GUO Jia, HAN Litao, SUN Xianlong, ZHOU Lijuan
Journal of Computer Applications    2021, 41 (3): 738-744.   DOI: 10.11772/j.issn.1001-9081.2020071071
To solve the problems that clustering centers cannot be determined automatically by the Density Peaks Clustering (DPC) algorithm and that center points and non-center points are not separated clearly enough in the decision graph, a Comparative density Peaks Clustering algorithm with Automatic determination of clustering centers (ACPC) was designed. Firstly, the distance parameter was replaced by a distance comparison quantity, so that potential clustering centers became more prominent in the decision graph. Then, a two-dimensional interval estimation method was used to select the clustering centers automatically, realizing the automation of the clustering process. Experimental results show that the ACPC algorithm achieves better clustering results on four synthetic datasets; comparisons on the Accuracy indicator over real datasets show that on the Iris dataset the clustering accuracy of ACPC reaches 94%, which is 27.3% higher than that of the traditional DPC algorithm, and that ACPC eliminates the need to select clustering centers interactively.
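
For readers unfamiliar with the quantities that DPC plots in its decision graph, the following minimal NumPy sketch computes the standard local density rho (cutoff kernel) and the distance delta to the nearest denser point; candidate centers are points where both are large. The paper's distance comparison quantity and two-dimensional interval estimation are not reproduced here.

import numpy as np

def dpc_quantities(X, dc):
    # X: (n, d) data matrix; dc: cutoff distance
    n = len(X)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    rho = (d < dc).sum(axis=1) - 1            # cutoff-kernel local density
    order = np.argsort(-rho)                  # indices by decreasing density
    delta = np.zeros(n)
    delta[order[0]] = d[order[0]].max()       # densest point gets the maximum
    for rank in range(1, n):
        i = order[rank]
        delta[i] = d[i, order[:rank]].min()   # distance to nearest denser point
    return rho, delta
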
Response obfuscation based secure deduplication method for cloud data with resistance against appending chunk attack
TANG Xin, ZHOU Linna
Journal of Computer Applications    2020, 40 (4): 1085-1090.   DOI: 10.11772/j.issn.1001-9081.2019081468
The appending chunk attack is an important attack threatening the security of cross-user deduplication of cloud data: a random number of non-duplicate chunks is appended to the file to be checked, making it impossible for the cloud service provider to determine whether the file truly exists, so the existence privacy of cloud data cannot be protected by general response obfuscation methods. To deal with this problem, a new response obfuscation based secure deduplication method resistant to the appending chunk attack was proposed. The number of appended chunks was calculated, the number of non-duplicate chunks was counted, and the two were compared to determine the minimum number of redundant chunks to involve in the response, thereby achieving the obfuscation. As a result, the existence of the checked file could not be judged by the attacker from the response, at the cost of little extra communication overhead. Security analysis and experimental results show that, compared with the state of the art in this field, the proposed method achieves a higher level of security with less overhead, or significantly improves security with comparable or slightly increased overhead.
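
The counting argument can be made concrete with a small sketch. The function below is a hypothetical server-side helper reflecting one plausible reading of the abstract, not the paper's protocol: it compares an estimate of how many chunks an attacker may have appended against the number of genuinely non-duplicate chunks, and pads the requested set with redundant (already stored) chunks so the response alone does not betray whether the rest of the file exists.

import random

def chunks_to_request(duplicate_flags, est_appended):
    # duplicate_flags[i] is True if chunk i is already stored server-side
    non_dup = [i for i, dup in enumerate(duplicate_flags) if not dup]
    dup = [i for i, dup in enumerate(duplicate_flags) if dup]
    # minimum redundancy: make the request at least as large as the
    # estimated number of appended chunks, so the response stays ambiguous
    k = min(len(dup), max(0, est_appended - len(non_dup)))
    return sorted(non_dup + random.sample(dup, k))
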
Outlier node detection algorithm in wireless sensor networks based on graph signal processing
LU Guangyue, ZHOU Liang, LYU Shaoqing, SHI Cong, SU Keke
Journal of Computer Applications    2020, 40 (3): 783-787.   DOI: 10.11772/j.issn.1001-9081.2019071224
Since low sensor security, harsh monitoring areas and resource limitations in Wireless Sensor Networks (WSNs) cause the data collected by nodes to contain outliers, an outlier node detection algorithm for WSNs based on graph signal processing was proposed. Firstly, a K-Nearest Neighbors (KNN) graph signal model was established according to the sensor positions. Secondly, a statistical test quantity was built based on the ratio of the smoothness of the graph signal before and after low-pass filtering. Finally, the existence of outlier nodes was judged from the test statistic and a decision threshold. Experiments on a public temperature dataset and a PM2.5 dataset demonstrate that, compared with an outlier node detection algorithm based on the graph frequency domain, the proposed algorithm increases the detection rate by 7% with a single outlier node, reaches a detection rate of 98% with multiple outlier nodes, and keeps a high detection rate when the outlier deviations are small.
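
A compact sketch of this pipeline under simplifying assumptions (unweighted KNN graph, combinatorial Laplacian, ideal low-pass filter keeping the lowest graph frequencies; the paper's exact filter and threshold are not specified in the abstract). A ratio well above 1 indicates high-graph-frequency energy that a smooth physical field should not carry, i.e. likely outlier nodes.

import numpy as np

def smoothness(L, x):
    # graph-signal smoothness x^T L x: small for smooth signals
    return float(x @ L @ x)

def outlier_statistic(positions, readings, k=4, keep=5):
    # positions: (n, 2) sensor coordinates; readings: (n,) sensor values
    n = len(positions)
    d = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in np.argsort(d[i])[1:k + 1]:   # k nearest neighbours of i
            W[i, j] = W[j, i] = 1.0
    L = np.diag(W.sum(1)) - W                 # combinatorial Laplacian
    lam, U = np.linalg.eigh(L)                # graph Fourier basis
    x_hat = U.T @ readings
    x_hat[keep:] = 0.0                        # ideal low-pass filter
    x_lp = U @ x_hat
    return smoothness(L, readings) / max(smoothness(L, x_lp), 1e-12)
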
Credit assessment method based on majority weighted minority oversampling technique and random forest
TIAN Chen, ZHOU Lijuan
Journal of Computer Applications    2019, 39 (6): 1707-1712.   DOI: 10.11772/j.issn.1001-9081.2018102180
To address the problem of imbalanced datasets in credit assessment and the limited classification performance of a single classifier on imbalanced data, a Majority Weighted Minority Oversampling TEchnique-Random Forest (MWMOTE-RF) credit assessment method was proposed. Firstly, the MWMOTE technique was applied in the preprocessing stage to increase the minority-class samples. Then, on the preprocessed balanced dataset, the random forest algorithm, a supervised machine learning algorithm, was used to classify and predict the data. With the Area Under the Curve (AUC) used to evaluate classifier performance, experiments were conducted on the German credit card dataset from the UCI repository and a company's car loan default dataset. The results show that the AUC value of the MWMOTE-RF method increases by 18% and 20% compared with the random forest method and the Naive Bayes method, respectively, on the same datasets. When the random forest method was instead combined with the Synthetic Minority Over-sampling TEchnique (SMOTE) and ADAptive SYNthetic over-sampling (ADASYN), the AUC value of the MWMOTE-RF method was 1.47% and 2.34% higher, respectively. The results prove the effectiveness of the proposed method and its improvement of classifier performance.
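
The pipeline is straightforward to reproduce with off-the-shelf tools. MWMOTE itself ships in the third-party smote_variants package rather than in imbalanced-learn, so the sketch below substitutes SMOTE as a stand-in oversampler; everything else follows the abstract (oversample the training split only, fit a random forest, score by AUC). All parameters are illustrative.

# pip install scikit-learn imbalanced-learn
from imblearn.over_sampling import SMOTE
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

def oversample_then_forest(X, y, seed=42):
    # oversample only the training data, never the held-out test split
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=seed)
    X_bal, y_bal = SMOTE(random_state=seed).fit_resample(X_tr, y_tr)
    clf = RandomForestClassifier(n_estimators=200, random_state=seed)
    clf.fit(X_bal, y_bal)
    return roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
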
Mechanism of sparse restricted Boltzmann machine based on competitive learning
ZHOU Lijun, LIU Kai, LYU Haiyan
Journal of Computer Applications    2018, 38 (7): 1872-1876.   DOI: 10.11772/j.issn.1001-9081.2018010001
To resolve the problems of feature homogenization in the unsupervised training of Restricted Boltzmann Machines (RBMs) and the non-adaptiveness of the Sparse Restricted Boltzmann Machine (SRBM), a new sparsity mechanism for RBMs based on competitive learning was designed. Firstly, a distance measure based on the cosine of the angle between a neuron's weight vector and the input vector was designed to evaluate similarity. Secondly, the best-matching hidden unit under this distance measure was selected for each sample during training. Thirdly, the sparsity penalties of the other hidden units were calculated according to the activation state of the best-matching hidden unit. Finally, the parameters were updated, and the competitive sparsity was applied to the construction of a Deep Boltzmann Machine (DBM) following the deep model training procedure. Handwritten digit recognition results show that, compared with a mechanism using the sum of squared errors as the regularization factor, the classification accuracy of the DBM based on the new sparsity mechanism is improved by 0.74% and the average sparsity measure is increased by 5.6%, without any sparsity parameters to set. Therefore, the proposed sparsity mechanism can improve the training efficiency of unsupervised models such as RBM and can be applied to the construction of deep models.
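
A minimal sketch of the competition step as the abstract describes it, with assumed shapes (W is the hidden-by-visible weight matrix, v one training sample). The exact form of the sparsity penalty is not given in the abstract, so the sketch only identifies the winner that escapes it.

import numpy as np

def best_matching_hidden_unit(W, v):
    # cosine similarity between each hidden unit's weight vector and the input
    sims = (W @ v) / (np.linalg.norm(W, axis=1) * np.linalg.norm(v) + 1e-12)
    winner = int(np.argmax(sims))
    penalty_mask = np.ones(W.shape[0])  # every hidden unit is penalised...
    penalty_mask[winner] = 0.0          # ...except the best-matching winner
    return winner, penalty_mask
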
Secrecy performance analysis of an opportunistic relaying transmission scheme
ZHANG Yongjian, HE Yucheng, ZHOU Lin
Journal of Computer Applications    2018, 38 (10): 2908-2912.   DOI: 10.11772/j.issn.1001-9081.2018030665
To prevent information from being intercepted by illegitimate users during wireless communication, a secure transmission strategy based on optimal relay selection was proposed. Firstly, pre-designed artificial noise was combined with the useful information at the source node, and the best relay was selected by the optimal relay selection algorithm to forward the received information. Secondly, the secrecy capacity, outage probability and intercept probability of the system were derived. Finally, the optimal number of relays was determined from the combined security and reliability performance. Theoretical analysis and simulation results show that, compared with a traditional system model without artificial noise, the performance of the proposed system can be significantly improved by adding relay nodes.
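
The standard definition underlying such derivations, as a runnable sketch: the per-relay instantaneous secrecy capacity is C_s = [log2(1 + SNR_main) - log2(1 + SNR_eve)]^+, and opportunistic selection picks the relay maximising it. The Rayleigh-faded link SNRs below are simulated purely for illustration; the paper's artificial-noise construction is not modelled.

import numpy as np

def best_relay_secrecy(snr_main, snr_eve):
    # secrecy capacity per relay, clipped at zero, and the best relay index
    cs = np.maximum(np.log2(1 + np.asarray(snr_main))
                    - np.log2(1 + np.asarray(snr_eve)), 0.0)
    return int(np.argmax(cs)), float(cs.max())

# example: 4 candidate relays, Rayleigh fading (exponential channel gains)
rng = np.random.default_rng(1)
idx, cs = best_relay_secrecy(10 * rng.exponential(size=4),
                             2 * rng.exponential(size=4))
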
Query performance and data migration for social network database with shard strategy based on clustering analysis
LIANG Shuang, ZHOU Lihua, YANG Peizhong
Journal of Computer Applications    2017, 37 (3): 673-679.   DOI: 10.11772/j.issn.1001-9081.2017.03.673
Social network data exhibit a degree of aggregation: similar users are more likely to engage in the same behaviors. With the conventional horizontal database sharding method, querying the information of such events consumes a large amount of time and connection overhead, because several databases have to be accessed in turn. To solve this problem, a database sharding strategy based on clustering analysis was proposed. By clustering the characteristic features of social network subjects, highly aggregated subjects were placed into one database, or into as few databases as possible, to improve event query efficiency while taking load balancing, large-scale data migration and other issues into account. The experimental results show that, for mainstream social networking events, the proposed strategy improves performance by up to 23.4%, while achieving locally optimal load balance and zero data migration. In general, the sharding strategy based on clustering analysis has considerable advantages over the conventional horizontal sharding method in query efficiency, load balancing and the feasibility of large-scale data migration.
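
A toy version of the strategy's core mapping, using scikit-learn's KMeans as a stand-in for the clustering step (the abstract does not name the algorithm) and ignoring the load-balancing constraints the paper adds on top. Users likely to co-occur in events share a database, so event queries touch fewer shards.

import numpy as np
from sklearn.cluster import KMeans

def build_shard_map(user_features, n_shards):
    # cluster users by behavioural features; one cluster = one shard
    km = KMeans(n_clusters=n_shards, n_init=10, random_state=0)
    labels = km.fit_predict(user_features)
    return {uid: int(shard) for uid, shard in enumerate(labels)}, km

def route_query(km, feature_vec):
    # a new user's rows go to the shard of the nearest cluster centroid
    return int(km.predict(np.asarray(feature_vec).reshape(1, -1))[0])
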
Building-damage detection based on combination of multi-features
LIU Yu, CAO Guo, ZHOU Licun, QU Baozhu
Journal of Computer Applications    2015, 35 (9): 2652-2655.   DOI: 10.11772/j.issn.1001-9081.2015.09.2652
To detect building-damage areas in post-seismic high-resolution remote sensing images, a building-damage detection method based on combined multi-features was proposed. Firstly, Morphological Attribute Profiles (MAPs) and the Local Binary Pattern (LBP) operator were used to extract geometric and texture features. Then, a Random Forest (RF) classifier was applied to extract damaged building regions as preliminary results. Finally, the ultimate building-damage areas were obtained by computing the damage ratio of each segmented object. Experiments were carried out on post-seismic aerial remote sensing images of Yushu with a spatial resolution of 0.1 m. The results show that the method improves overall accuracy by 12% compared with a Morphological Profile (MP)-based method, indicating that the proposed method can effectively detect building-damage areas with high accuracy in post-seismic high-resolution images.
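
A reduced sketch of the texture half of the feature pipeline (scikit-image LBP histograms fed to a random forest). The morphological attribute profiles and the per-object damage-ratio step are omitted, and patch sizes and labels are assumptions.

# pip install scikit-image scikit-learn
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.ensemble import RandomForestClassifier

def lbp_histogram(patch, P=8, R=1.0):
    # uniform LBP histogram of one grey-level image patch
    codes = local_binary_pattern(patch, P, R, method="uniform")
    hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2), density=True)
    return hist

def train_damage_classifier(patches, labels):
    # labels: 1 = damaged building patch, 0 = intact
    X = np.vstack([lbp_histogram(p) for p in patches])
    return RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
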
Novel channel-adaptive coded cooperation scheme
QIAO Ying, HE Yucheng, ZHOU Lin
Journal of Computer Applications    2015, 35 (5): 1218-1223.   DOI: 10.11772/j.issn.1001-9081.2015.05.1218
To overcome the severe performance loss of conventional coded cooperation schemes under the dynamic channel conditions of mobility scenarios, a novel adaptive coded cooperation scheme was proposed by using rate-compatible Low-Density Parity-Check (LDPC) codes in combination with a Hybrid Automatic Repeat reQuest (HARQ) protocol. The channel state information was assumed to change during each transmission. By automatic retransmission of unequal-length incremental redundancy, the equivalent code rates at the cooperative and destination nodes could be adjusted nonlinearly with the channel conditions. Expressions for the outage probability and throughput were derived to evaluate the system performance of the proposed scheme, and theoretical analysis and simulation results were presented. These results show that, compared with conventional schemes and equal-length retransmission schemes, the proposed scheme with properly designed compatible rates can effectively reduce the system outage probability, increase the throughput, and improve the transmission reliability of cooperative communications in mobility scenarios.
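
The paper derives closed-form outage and throughput expressions; as a generic sanity check, incremental-redundancy HARQ outage can also be estimated by Monte Carlo under block Rayleigh fading, where mutual information accumulates across retransmission rounds. All parameters below are illustrative, not the paper's settings.

import numpy as np

def outage_probability(rate_bps_hz, mean_snr, rounds, trials=100_000):
    # one independent Rayleigh realisation per (re)transmission round
    rng = np.random.default_rng(0)
    snr = mean_snr * rng.exponential(size=(trials, rounds))  # |h|^2 ~ Exp(1)
    info = np.log2(1 + snr).sum(axis=1)   # accumulated mutual information
    return float((info < rate_bps_hz).mean())
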

Multi-label classification algorithm based on joint probability
HE Peng, ZHOU Lijuan
Journal of Computer Applications    2015, 35 (3): 659-662.   DOI: 10.11772/j.issn.1001-9081.2015.03.659
Since the Multi-Label k Nearest Neighbor (ML-kNN) algorithm ignores the correlations between labels, a multi-label classification algorithm based on joint probability was proposed. Firstly, the prior probabilities were calculated while traversing the sample space. Secondly, the conditional probability that a label appears m times among the k nearest neighbors, given that the label takes value 1 or 0, was computed. Then, the joint probability distribution of labels, also computed while traversing the sample space, was used as the multi-label classification model. Finally, the coRrelation Multi-Label-kNN (RML-kNN) classification model was deduced by maximizing the posterior probability. Theoretical analysis and comparison experiments on several datasets show that RML-kNN raises Subset Accuracy to 0.9612 in the best case, a 2.25% improvement over ML-kNN; that RML-kNN significantly reduces Hamming Loss, reaching a minimum of 0.0022; and that its Micro-F-Measure reaches up to 0.9767, a 2.88% improvement over ML-kNN in the best case. The experimental results show that RML-kNN outperforms ML-kNN because it integrates the correlations between labels during classification.

Blind separation method for source signals with temporal structure based on second-order statistics
QIU Mengmeng, ZHOU Li, WANG Lei, WU Jianqiang
Journal of Computer Applications    2014, 34 (9): 2510-2513.   DOI: 10.11772/j.issn.1001-9081.2014.09.2510
The objective of Blind Source Separation (BSS) is to restore unobservable source signals from their mixtures without prior knowledge of the mixing process. The potential source signals are assumed to be spatially uncorrelated but temporally correlated, i.e. to have non-vanishing temporal structure. A BSS method based on second-order statistics was proposed for such sources. Robust prewhitening was first performed on the observed mixture signals, with the source dimension estimated by the Minimum Description Length (MDL) criterion. Then, blind separation was realized by applying Singular Value Decomposition (SVD) to the time-delayed covariance matrix of the whitened signals. Simulation on the separation of a group of speech signals proves the effectiveness of the algorithm, whose performance was measured by the Signal-to-Interference Ratio (SIR) and the Performance Index (PI).
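
The two-stage structure (whitening, then a second-order decomposition of a time-lagged covariance) is the classical AMUSE recipe. The sketch below uses a symmetrised eigendecomposition where the paper applies SVD, and omits the MDL-based dimension estimation.

import numpy as np

def amuse(X, tau=1):
    # X: (n_sensors, T) observed mixtures
    X = X - X.mean(axis=1, keepdims=True)
    C0 = X @ X.T / X.shape[1]
    d, E = np.linalg.eigh(C0)
    W_white = E @ np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12))) @ E.T
    Z = W_white @ X                                  # whitened signals
    Ct = Z[:, :-tau] @ Z[:, tau:].T / (Z.shape[1] - tau)
    Ct = (Ct + Ct.T) / 2                             # symmetrise the lagged covariance
    _, V = np.linalg.eigh(Ct)                        # rotation separating sources
    return V.T @ Z, V.T @ W_white                    # estimated sources, unmixing matrix
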

Sheep body size measurement based on computer vision
JIANG Jie, ZHOU Lina, LI Gang
Journal of Computer Applications    2014, 34 (3): 846-850.   DOI: 10.11772/j.issn.1001-9081.2014.03.0846
Body size parameters are important indicators for evaluating the growth status of sheep, and how to measure them with non-stress instruments is an urgent problem in sheep breeding. This paper introduced machine vision methods to measure these parameters. The sheep body in a complex environment was detected by a gray-based background subtraction method and the chromaticity invariance principle. The contour envelope of the sheep body was extracted by the grid method. After analyzing the contour sequence with the D-P (Douglas-Peucker) algorithm and the Heron-Qin Jiushao formula, the point of maximum curvature on the contour was acquired and chosen as the measurement point at the hip. Based on this information, the other three measurement points were obtained by the four-point method, and, combined with the spatial resolution, the body size parameters of the sheep were acquired, achieving contactless measurement. The experimental results show that the proposed method can effectively extract the sheep body in complex environments, and that the measurement point at the hip and the height of the sheep can be stably determined. Owing to the complexity of ambient light, some problems remain in determining the shoulder points.

Shopping information extraction method based on rapid construction of template
LI Ping, ZHU Jianbo, ZHOU Lixin, LIAO Bin
Journal of Computer Applications    2014, 34 (3): 733-737.   DOI: 10.11772/j.issn.1001-9081.2014.03.0733
Concerning shopping information Web pages constructed from templates, and the large amount of Web information and complex page structure, this paper studied how to extract shopping information from templated Web pages without using complex learning rules. The paper defined the Web page template and the extraction template, designed a template language for constructing templates, and gave a template-based extraction model. The experimental results on 450 standard Web pages show that the recall rate of the proposed method is 12% higher than that of the EXALG algorithm; on 250 standard Web pages, its recall rate is 7.4% higher than that of ViNTs (Visual information and Tag structure based wrapper generator) and 0.2% higher than that of ViPER (Augmenting automatic information extraction with visual perceptions), and its accuracy rate is 5.2% higher than that of ViNTs and 0.2% higher than that of ViPER. The clear improvements in recall and accuracy of the extraction method based on rapidly constructed templates in turn improve the accuracy of Web page analysis and the information recall of shopping information retrieval and shopping comparison systems.

Algorithm for detecting approximate duplicate records in massive data
ZHOU Dianrui, ZHOU Lianying
Journal of Computer Applications    2013, 33 (08): 2208-2211.  
To address the low precision and low time efficiency of approximate duplicate record detection algorithms on massive data, an integrated weighting method and a filtering method based on string lengths were adopted for detecting approximately duplicate records in datasets. The integrated weighting method combines user experience and mathematical statistics to calculate the weight of each attribute, making the weight calculation more scientific. The filtering method based on string lengths uses the length difference between strings to terminate the edit distance computation early, which reduces the number of record pairs to be matched during detection. The experimental results show that the weight vector calculated by the integrated weighting method reflects the importance of each field more comprehensively and accurately, and that the length-based filtering method reduces the comparison time between records, effectively solving the problem of detecting approximately duplicate records in massive data.
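
The length filter and the early termination are easy to make concrete. In the sketch below, record fields are compared as strings; the filter rejects pairs whose length difference already exceeds the edit-distance budget, and the row-by-row computation stops as soon as the running minimum shows the budget cannot be met (row minima of the edit-distance table never decrease).

def is_near_duplicate(a, b, max_dist=2):
    # length filter: |len(a) - len(b)| is a lower bound on the edit distance
    if abs(len(a) - len(b)) > max_dist:
        return False
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,         # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        if min(cur) > max_dist:   # no path can get cheaper: stop early
            return False
        prev = cur
    return prev[-1] <= max_dist
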
Face recognition with adaptive local-Gabor features based on energy
ZHOU Lijian, MA Yanyan, SUN Jie
Journal of Computer Applications    2013, 33 (03): 700-703.   DOI: 10.3724/SP.J.1087.2013.00700
Concerning the time consumption and computational complexity of extracting face features with traditional Gabor filter banks, face features were extracted using three different local Gabor filter banks chosen adaptively according to the energy of the Gabor images in different directions, different scales, and overall. Firstly, the Gabor features of some images in the face database were extracted and analyzed, and the local Gabor filter banks were built by choosing the filters corresponding to the images with larger energy. Then, Fisher features were further extracted using Linear Discriminant Analysis (LDA). Finally, face recognition was realized using the nearest neighbor method. Experimental results on the ORL and YALE face databases show that the proposed approach achieves better face recognition performance with fewer feature dimensions and less calculation time.
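
A sketch of energy-based selection over the classical 40-filter Gabor bank (5 scales x 8 orientations). The wavelength and sigma settings and the number of retained filters are assumptions, and the LDA and nearest-neighbour stages are omitted.

import cv2
import numpy as np

def top_energy_gabor_filters(images, ksize=31, n_keep=12):
    # images: iterable of grey-level face images used to score the bank
    bank, energy = [], []
    for s in range(5):                        # scales via wavelength
        lam = 4 * (2 ** (s / 2))
        for o in range(8):                    # orientations
            theta = o * np.pi / 8
            k = cv2.getGaborKernel((ksize, ksize), 4.0, theta, lam, 0.5)
            bank.append(k)
            e = sum(np.abs(cv2.filter2D(img, cv2.CV_64F, k)).sum()
                    for img in images)        # total response energy
            energy.append(e)
    keep = np.argsort(energy)[-n_keep:]       # most energetic filters
    return [bank[i] for i in keep]
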
Algorithm of generating multi-resolution curves for progressive transmission over the Internet
CAO Zhenzhou, LI Manchun, CHENG Liang, CHEN Zhenjie
Journal of Computer Applications    2013, 33 (03): 688-690.   DOI: 10.3724/SP.J.1087.2013.00688
Concerning the high time complexity and topological inconsistency of multi-resolution curve representation for progressive transmission, an algorithm for generating multi-resolution curves for progressive transmission over the Internet was proposed. By using pre-stored vertex deviations to simplify curves and an optimized monotone chain intersection algorithm to maintain topological consistency, the algorithm can quickly generate topologically consistent multi-resolution curves. The algorithm was applied in progressive transmission experiments on curve data; the results show that the multi-resolution curve data maintain topological consistency and that the generation time grows linearly with the amount of data, verifying the effectiveness of the algorithm.
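
One way to realise pre-stored vertex deviations, sketched under the assumption that a vertex's deviation is its perpendicular distance to the chord joining its two neighbours (the abstract does not spell out its deviation measure). Deviations are computed once; each resolution level is then produced by a cheap threshold pass instead of a full simplification at request time.

import numpy as np

def vertex_deviations(pts):
    # pts: (n, 2) polyline vertices, n >= 3
    pts = np.asarray(pts, float)
    a, p, b = pts[:-2], pts[1:-1], pts[2:]
    chord = b - a
    # |cross(b - a, p - a)| / |b - a| = distance from p to line a-b
    num = np.abs(chord[:, 0] * (a[:, 1] - p[:, 1]) -
                 chord[:, 1] * (a[:, 0] - p[:, 0]))
    return num / np.maximum(np.linalg.norm(chord, axis=1), 1e-12)

def simplify(pts, tol):
    # keep endpoints plus every interior vertex whose deviation >= tol
    dev = vertex_deviations(pts)
    keep = [0] + [i + 1 for i, d in enumerate(dev) if d >= tol] + [len(pts) - 1]
    return [pts[i] for i in keep]
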
Improved image segmentation algorithm based on GrabCut
ZHOU Liangfen, HE Jiannong
Journal of Computer Applications    2013, 33 (01): 49-52.   DOI: 10.3724/SP.J.1087.2013.00049
To address the GrabCut algorithm's sensitivity to local noise, long running time and imperfect edge extraction, an improved image segmentation algorithm based on GrabCut was put forward. Multi-scale watershed transformation was used to smooth and denoise the gradient image, and the watershed operation was applied again to the new gradient image, which not only enhanced image edge points but also reduced the computation cost of subsequent processing. Then, an entropy penalty factor was used to optimize the segmentation energy function to prevent loss of target information. The experimental results show that, compared with the traditional algorithm, the proposed algorithm reduces the error rate, increases the Kappa coefficient and improves efficiency; in addition, the extracted edges are more complete and smooth. The improved algorithm is applicable to different types of image segmentation.
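
For reference, the unimproved baseline the paper builds on is available directly in OpenCV. The sketch below runs plain GrabCut from a bounding-box initialisation; the watershed pre-processing and entropy penalty of the proposed algorithm are not part of OpenCV's implementation.

import cv2
import numpy as np

def grabcut_segment(img_bgr, rect):
    # rect: (x, y, w, h) box assumed to contain the foreground object
    mask = np.zeros(img_bgr.shape[:2], np.uint8)
    bgd = np.zeros((1, 65), np.float64)   # background GMM model buffer
    fgd = np.zeros((1, 65), np.float64)   # foreground GMM model buffer
    cv2.grabCut(img_bgr, mask, rect, bgd, fgd, 5, cv2.GC_INIT_WITH_RECT)
    fg = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 1, 0)
    return img_bgr * fg[:, :, None].astype(np.uint8)
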
Algorithm of biased skeleton trim based on intersecting cortical model
ZHOU Li, HE Lin-yuan, SUN Yi, BI Du-yan, GAO Shan
Journal of Computer Applications    2012, 32 (09): 2553-2555.   DOI: 10.3724/SP.J.1087.2012.02553
To solve the geometric distortion and low efficiency of biased skeleton-branch trimming, a new trimming algorithm based on the Intersecting Cortical Model (ICM) was proposed. First, according to the inherent features of biased skeleton branches, the definitions of endpoint and junction point were introduced and revised to locate skeleton branches and biased branches accurately. Then, a flameout condition for the spread of neuron firing was set up from this information and the iteration number of the intersecting cortical model. Finally, guided by this condition and aided by the impulses dynamically generated by firing neurons, whose transmission is biologically parallel, biased skeleton branches can be judged quickly and trimmed accurately. Compared with conventional methods based on mathematical morphology, the experimental results show that the proposed algorithm performs well in preserving the structural integrity of the skeleton as well as in computation speed and noise resistance.
Web resource recommendation method based on intuitionistic fuzzy clustering
XIAO Man-sheng, WANG Xin-fang, ZHOU Li-juan
Journal of Computer Applications    2012, 32 (09): 2480-2482.   DOI: 10.3724/SP.J.1087.2012.02480
For the classification of Web resources, a Web resource recommendation method based on intuitionistic fuzzy C-means clustering was proposed to solve the problems that traditional methods based on user interest cannot accurately reflect changes in user interests and have difficulty distinguishing the quality and content style of resources. In this method, the Web resources were first represented as intuitionistic fuzzy data according to the user interest degree; then the aggregation theory of intuitionistic fuzzy information was applied to classify the resources; finally, similar resources were recommended to users. Theoretical analysis and experimental results show that this method has a great advantage in recommendation quality over the traditional fuzzy C-means and collaborative filtering methods.
HBase-based storage system for large-scale data in wireless sensor network
CHEN Qing-kui, ZHOU Li-zhen
Journal of Computer Applications    2012, 32 (07): 1920-1923.   DOI: 10.3724/SP.J.1087.2012.01920
Wireless Sensor Networks (WSNs) are spreading widely, and with their expansion a large number of sensors produce massive sensor data. To store the data from large-scale WSNs, this paper proposed a two-tier storage architecture based on HBase, consisting of HBase clusters storing sensor data from different regions and a global data management directory, which achieves near real-time storage. The experimental results show that the system has high scalability and high storage and query efficiency, and can meet the storage demands of massive sensor data.
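
A sketch of what the regional storage tier could look like through the happybase client. The table naming, column family and row-key layout are assumptions for illustration, and the global data management directory tier is omitted.

# pip install happybase  (requires the HBase Thrift gateway to be enabled)
import time
import happybase

def store_reading(conn, region, sensor_id, value):
    # one table per region; row key 'sensor_id|reverse_timestamp' so each
    # sensor's newest readings sort first and per-sensor scans stay contiguous
    table = conn.table(f"wsn_{region}")
    ts = 2**63 - 1 - time.time_ns()           # reverse timestamp
    table.put(f"{sensor_id}|{ts:020d}".encode(),
              {b"d:value": str(value).encode()})

conn = happybase.Connection("hbase-thrift-host")   # hypothetical host
store_reading(conn, "region01", "sensor42", 23.5)
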
Enhanced hierarchical mobile IPv6 based on route optimized communication
ZHOU Lian-ying
Journal of Computer Applications    2012, 32 (06): 1491-1494.   DOI: 10.3724/SP.J.1087.2012.01491
In Hierarchical Mobile IPv6 (HMIPv6), even when there is a shorter route between a Correspondent Node (CN) and a Mobile Node (MN), all packets between them are still forwarded by the Mobility Anchor Point (MAP), which brings needless system overhead and packet transmission delay; moreover, the sequential binding update process of inter-domain handover incurs a certain binding update delay. Therefore, an enhanced HMIPv6 is proposed, in which route-optimized communication is applied once the ratio of the number of transmitted packets to the number of binding updates reaches a given threshold, and the binding update process of inter-domain handover is improved. A mathematical performance analysis proves that the enhanced HMIPv6 reduces system overhead, packet transmission delay and binding update delay compared with HMIPv6.
Binarization algorithm for CCD wool images with weak contour
ZHOU Li, BI Du-yan, ZHA Yu-fei, LUO Hong-kai, HE Lin-yuan
Journal of Computer Applications    2012, 32 (04): 1133-1136.   DOI: 10.3724/SP.J.1087.2012.01133
To reduce the distortion of wool geometric dimensions that results from binarizing images with weak contours, an automatic binarization algorithm for Charge-Coupled Device (CCD) wool images was proposed, with reference to a ramp-width-reduction approach based on intensity and gradient indices and using a classical global threshold method together with a local one. In the algorithm, an edge-pixel-seeking step was added and the gray-adjusting factor was improved, with the Sobel operator and a ramp edge model introduced, to increase processing efficiency and avoid human intervention. Besides, based on an analysis of Otsu's and Bernsen's methods, every sub-image was processed with a mixed global and local threshold to intensify edge details and decrease distortion. The experimental results show that, compared with traditional methods, the new algorithm performs well in the automatic binarization of images with weak contours.
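
A compact illustration of mixing a global and a local threshold in the spirit of the analysis above: Bernsen's min/max midpoint decides high-contrast pixels, and Otsu's global value decides flat regions. The window size and contrast cutoff are assumptions; this is not the paper's exact combination.

import cv2
import numpy as np

def mixed_threshold(gray, win=31, contrast_min=15):
    # gray: uint8 single-channel image
    otsu_t, _ = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (win, win))
    local_min = cv2.erode(gray, kernel)       # windowed minimum
    local_max = cv2.dilate(gray, kernel)      # windowed maximum
    bernsen_t = (local_min.astype(np.int32) + local_max) // 2
    contrast = local_max.astype(np.int32) - local_min
    # trust Bernsen only where local contrast is high; else fall back to Otsu
    t = np.where(contrast >= contrast_min, bernsen_t, int(otsu_t))
    return (gray > t).astype(np.uint8) * 255
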
Parallel algorithm for computing coefficients of periodic B-spline basis functions
Kai-ting ZHOU, Li-xin ZHENG, Fu-yong LIN
Journal of Computer Applications    2011, 31 (07): 1800-1803.   DOI: 10.3724/SP.J.1087.2011.01800
In existing methods of periodic B-spline interpolation, the coefficients of the B-spline basis functions are determined by iterative algorithms. To overcome this weakness, a new parallel algorithm for computing the coefficients of B-spline basis functions was established. First, an orthogonal B-spline basis was constructed and a parallel algorithm for the coefficients of the orthogonal basis functions was derived; then, a parallel algorithm for the coefficients of the B-spline basis functions was derived from the relation between the two sets of coefficients; finally, explicit formulas were derived for both the coefficients of the B-spline basis functions and the values of interpolated points for 2nd-, 3rd- and 4th-order periodic interpolating B-spline functions. The presented method retains the simplicity of B-spline basis functions while providing a fast parallel algorithm for their coefficients.
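
The existence of explicit, parallel-computable formulas follows from the structure of the periodic interpolation system. As an illustration for the uniform cubic case (not the paper's orthogonal-basis derivation): the interpolation conditions form a circulant system, circulant systems are diagonalised by the discrete Fourier transform, and so every coefficient has an independent closed-form expression.

\[
  \tfrac{1}{6}\,c_{j-1} + \tfrac{2}{3}\,c_j + \tfrac{1}{6}\,c_{j+1} = p_j
  \qquad (\text{indices mod } n),
\]
\[
  c_j \;=\; \frac{1}{n}\sum_{k=0}^{n-1}
    \frac{\hat{p}_k\, e^{\,2\pi i jk/n}}
         {\tfrac{2}{3} + \tfrac{1}{3}\cos\!\left(2\pi k/n\right)},
  \qquad
  \hat{p}_k \;=\; \sum_{m=0}^{n-1} p_m\, e^{-2\pi i mk/n},
\]

where the denominators are the eigenvalues of the circulant interpolation matrix, so all coefficients can be computed in parallel.
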
Inverse control based on support vector machines and its stability analysis
Lu-zhou LIU, Jian XIAO
Journal of Computer Applications   
Support Vector Machine (SVM) is a machine learning method built on Structural Risk Minimization (SRM) with good generalization ability. A method of constructing an inverse controller using SVM was given in this paper. A pseudo-linear compound system was formed by cascading the inverse controller before the original system. The finite-gain stability of the controller in this open-loop inverse control system was proved under the assumption that the kernel function is locally Lipschitz, and a sufficient condition for the Gaussian kernel function to be locally Lipschitz in either variable was given. The stability of the whole system was also proved under proper assumptions.
Effective bandwidth and measurement-based admission control algorithm for heavy traffic
GUI Zhi-bo, ZHOU Li-chao
Journal of Computer Applications    2005, 25 (09): 2098-2099.   DOI: 10.3724/SP.J.1087.2005.02098
Based on TES (Transform Expand Sample) models of heavy traffic with certain burstiness, a new practical formula for computing the effective bandwidth of traffic was derived. When the effective bandwidth computed by this formula is used as the declared bandwidth of newly arriving traffic, simulation results on the related Measurement-Based Admission Control (MBAC) algorithm show that better network utilization can be achieved than with MBAC algorithms that use only the average rate or peak rate as the declared bandwidth.
Survey on the research of focused crawling technique
ZHOU Li-zhu, LIN Ling
Journal of Computer Applications    2005, 25 (09): 1965-1969.   DOI: 10.3724/SP.J.1087.2005.01965
This survey of focused crawling starts with the motivation for the research and an introduction to the basic concepts of focused crawling. The key issues in focused crawling are reviewed, such as webpage analysis algorithms and Web searching strategies. How to crawl relevant data and information according to different requirements is discussed in detail, and three representative architectures of focused crawler systems are analyzed. Future directions for focused crawling research are indicated, including crawling for data analysis and data mining, topic description, finding relevant Web pages, Web data cleaning, and the extension of the search space.
Improved clustering algorithm based on density and grid in the presence of obstacles
YAN Xin, ZHOU Li-hua, CHEN Ke-ping, XU Guang-yi
Journal of Computer Applications    2005, 25 (08): 1818-1820.   DOI: 10.3724/SP.J.1087.2005.01818
An improved grid-based diffusion clustering algorithm in the presence of obstacles, called DCellO1, was proposed. Based on a grid, it combines density-based clustering with the seed-filling algorithm from computer graphics, so it can construct clusters of arbitrary shape in the presence of obstacles and obtain good clustering results when objects are unevenly distributed. The experiments prove the superiority and effectiveness of DCellO1.
Using Java technology to implement SIP communication
YANG Peng, ZHAO Bo, WANG Kun, ZHOU Li-hua
Journal of Computer Applications    2005, 25 (02): 276-278.   DOI: 10.3724/SP.J.1087.2005.0276
As an application-layer session control protocol, SIP features simplicity, extensibility and scalability. On the basis of a brief introduction to the SIP protocol, the JAIN SIP development framework from Sun for implementing SIP communication was discussed in detail. Using the Java language with JAIN SIP at its core, the basic methods of the various communication entities involved in SIP communication were described, and a simple SIP communication model was built.
